Chernoff Bounds
Abstract
If m = 2, i.e., P = (p, 1 − p) and Q = (q, 1 − q), we also write D_KL(p ‖ q). The Kullback-Leibler divergence provides a measure of distance between the distributions P and Q: it represents the expected loss of efficiency if we encode an m-letter alphabet with distribution P using a code that is optimal for distribution Q. We can now state the general form of the Chernoff bound.

Theorem 1.1. Let X_1, …, X_n be independent random variables with X_i ∈ {0, 1} and Pr[X_i = 1] = p, for i = 1, …, n. Set X := ∑_{i=1}^{n} X_i. Then, for any t ∈ [0, 1 − p], we have

Pr[X ≥ (p + t)n] ≤ e^{−D_KL(p + t ‖ p) · n}.
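As a small numerical sanity check (my own illustration, not part of the original abstract; the function names are assumptions), the bound of Theorem 1.1 can be compared against the exact binomial tail probability:

```python
import math

def kl_divergence(p, q):
    """Bernoulli Kullback-Leibler divergence D_KL(p || q)."""
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

def chernoff_upper_tail(n, p, t):
    """Upper bound on Pr[X >= (p + t) n] from Theorem 1.1: e^{-D_KL(p+t || p) n}."""
    return math.exp(-kl_divergence(p + t, p) * n)

def exact_upper_tail(n, p, t):
    """Exact Pr[X >= ceil((p + t) n)] for X ~ Binomial(n, p)."""
    k0 = math.ceil((p + t) * n)
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(k0, n + 1))

# 100 fair coin flips, deviation t = 0.1 above the mean (threshold 60 heads).
n, p, t = 100, 0.5, 0.1
bound = chernoff_upper_tail(n, p, t)  # ~0.134
exact = exact_upper_tail(n, p, t)     # ~0.028

# The theorem guarantees the exact tail never exceeds the bound.
assert exact <= bound
```

Note that the bound is loose by a constant factor here, but it decays at the correct exponential rate e^{−D_KL(p+t ‖ p)·n} as n grows.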
Similar resources
Chernoff bounds on pairwise error probabilities of space-time codes
We derive Chernoff bounds on pairwise error probabilities of coherent and noncoherent space-time signaling schemes. First, general Chernoff bound expressions are derived for a correlated Ricean fading channel and correlated additive Gaussian noise. Then, we specialize the obtained results to the cases of space-time separable noise, white noise, and uncorrelated fading. We derive approximate Cher...
Geometric Applications of Chernoff-type Estimates
In this paper we present a probabilistic approach to some geometric problems in asymptotic convex geometry. The aim of this paper is to demonstrate that the well known Chernoff bounds from probability theory can be used in a geometric context for a very broad spectrum of problems, and lead to new and improved results. We begin by briefly describing Chernoff bounds, and the way we will use them....
The Chernoff bounds provide very accurate information concerning the tail probabilities of most distributions. In this paper we describe some relevant properties of these bounds.
A note on the distribution of the number of prime factors of the integers
The Chernoff-Hoeffding bounds are fundamental probabilistic tools. An elementary approach is presented to obtain a Chernoff-type upper-tail bound for the number of prime factors of a random integer in {1, 2, …, n}. The method illustrates tail bounds in negatively-correlated settings.
Chernoff Bounds, and Some Applications
Preliminaries Before we venture into Chernoff bounds, let us recall two simple bounds on the probability that a random variable deviates from its mean by a certain amount: Markov's inequality and Chebyshev's inequality. Markov's inequality only applies to non-negative random variables and gives us a bound depending on the expectation of the random variable.
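As a sketch of the hierarchy the snippet above alludes to (my own illustration, not taken from the cited work), the three inequalities can be compared on the sum of 100 fair coin flips at threshold 60:

```python
import math

# X = number of heads in n fair coin flips: E[X] = n*p, Var[X] = n*p*(1-p).
n, p, a = 100, 0.5, 60

# Markov: Pr[X >= a] <= E[X] / a (only needs non-negativity).
markov = (n * p) / a

# Chebyshev: Pr[|X - E[X]| >= a - E[X]] <= Var[X] / (a - E[X])^2.
chebyshev = (n * p * (1 - p)) / (a - n * p) ** 2

def dkl(x, y):
    """Bernoulli Kullback-Leibler divergence D_KL(x || y)."""
    return x * math.log(x / y) + (1 - x) * math.log((1 - x) / (1 - y))

# Chernoff: Pr[X >= a] <= e^{-D_KL(a/n || p) n}, exponentially small in n.
chernoff = math.exp(-dkl(a / n, p) * n)

# The bounds tighten in this order for this example: ~0.83 > 0.25 > ~0.13.
assert markov > chebyshev > chernoff
```

The progression shows why Chernoff bounds are preferred for large deviations: Markov decays like 1/a, Chebyshev like 1/(a − E[X])², while Chernoff decays exponentially in n.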
Publication date: 2015